Similar Resources
Recursive Principal Components Analysis
Principal components analysis is an important and well-studied subject in statistics and signal processing. The literature offers an abundance of algorithms for this problem, most of which can be grouped into one of the following three approaches: adaptation based on Hebbian updates and deflation, optimization of a second-order statistical criterion (like reconstruction ...
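To make the first of those approaches concrete, here is a minimal sketch of a Hebbian update with built-in deflation, using Sanger's Generalized Hebbian Algorithm as a representative example; the function name, learning rate, and toy data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gha_update(W, x, lr=0.005):
    """One Generalized Hebbian Algorithm (Sanger's rule) step.

    W: (k, d) current estimate of the top-k principal directions.
    x: (d,) centered input sample.
    """
    y = W @ x  # component activations
    # Hebbian term y x^T minus a lower-triangular deflation term that
    # decorrelates each component from the ones above it.
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4)) @ np.diag([3.0, 2.0, 1.0, 0.5])
X -= X.mean(axis=0)
W = rng.normal(scale=0.1, size=(2, 4))
for x in X:
    W = gha_update(W, x)
# Rows of W converge (up to sign) toward the top-2 eigenvectors of cov(X).
```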
Recursive principal components analysis
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal st...
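As a rough illustration of the recurrent idea (not the paper's exact architecture), the toy sketch below applies Oja's rule to an input vector extended with a decayed copy of the network's previous output, so the learned direction reflects temporal context rather than the instantaneous input alone; the decay factor alpha and all names are assumptions:

```python
import numpy as np

def recursive_oja(xs, alpha=0.5, lr=0.01, seed=0):
    """Toy sketch: Oja's rule on an input extended with the previous output."""
    rng = np.random.default_rng(seed)
    d = xs.shape[1]
    w = rng.normal(scale=0.1, size=d + 1)        # weights over [x_t, y_{t-1}]
    y_prev = np.zeros(1)
    for x in xs:
        z = np.concatenate([x, alpha * y_prev])  # recurrent extended input
        y = w @ z
        w += lr * y * (z - y * w)                # Oja's constrained Hebbian update
        y_prev = np.array([y])
    return w
```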
Online (Recursive) Robust Principal Components Analysis
This work studies the problem of sequentially recovering a sparse vector S_t and a vector from a low-dimensional subspace L_t from knowledge of their sum M_t := L_t + S_t. If the primary goal is to recover the low-dimensional subspace in which the L_t's lie, then the problem is one of online or recursive robust principal components analysis (PCA). An example of where such a problem might arise is in ...
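For intuition only, here is a heavily simplified sketch of one splitting step, assuming the current subspace estimate U (orthonormal columns) is already known and using naive hard thresholding in place of the paper's actual sparse-recovery machinery; the names and threshold are hypothetical:

```python
import numpy as np

def split_sample(m_t, U, thresh):
    """Split m_t = l_t + s_t given an orthonormal subspace estimate U.

    Projecting onto the orthogonal complement of U (almost) annihilates
    the low-rank part, leaving approximately the projected sparse part,
    which is then crudely hard-thresholded.
    """
    resid = m_t - U @ (U.T @ m_t)                         # (I - U U^T) m_t
    s_hat = np.where(np.abs(resid) > thresh, resid, 0.0)  # crude sparse estimate
    l_hat = m_t - s_hat
    return l_hat, s_hat
```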
Persian Handwriting Analysis Using Functional Principal Components
Principal components analysis is a well-known statistical method for dealing with large dependent data sets. It is also used on functional data, both for data reduction and for representing variation. On the other hand, "handwriting" is one of the objects studied in statistical fields such as pattern recognition and shape analysis. Considering time as the argument,...
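A minimal sketch of the functional-PCA idea, assuming curves (e.g. pen coordinates of handwriting traces recorded as functions of time) are already sampled on a shared grid; real FPCA would typically add basis smoothing, which is omitted here:

```python
import numpy as np

def functional_pca(curves, n_components=2):
    """Toy functional PCA on curves sampled over a common time grid.

    curves: (n_curves, n_timepoints) array.
    Returns the mean curve, the leading (discretized) eigenfunctions,
    and the per-curve scores used for data reduction.
    """
    mean = curves.mean(axis=0)
    centered = curves - mean
    # Right singular vectors of the centered data matrix are the
    # discretized principal component functions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    eigenfunctions = vt[:n_components]
    scores = centered @ eigenfunctions.T
    return mean, eigenfunctions, scores
```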
Online Principal Components Analysis
We consider the online version of the well-known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of d-dimensional vectors X = [x_1, ..., x_n] and a target dimension k < d; the output is a set of k-dimensional vectors Y = [y_1, ..., y_n] that minimize the reconstruction error: min_Φ Σ_i ‖x_i − Φ y_i‖². Here, Φ ∈ R^{d×k} is restricted to being isometric. The...
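As a reference point for that objective, the batch (offline) problem is solved exactly by a truncated SVD; a minimal sketch, using the abstract's column-sample convention (the function name is an assumption):

```python
import numpy as np

def pca_reconstruction_error(X, k):
    """Batch solution of min over isometric Phi of sum_i ||x_i - Phi y_i||^2.

    X: (d, n) matrix whose columns are the samples x_i.
    The optimum is attained at the top-k left singular vectors of X.
    """
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Phi = U[:, :k]        # isometric: Phi^T Phi = I_k
    Y = Phi.T @ X         # optimal codes y_i = Phi^T x_i
    return np.sum((X - Phi @ Y) ** 2)
```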
Journal
Journal title: Neural Networks
Year: 2005
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2005.07.005